DTE AICCOMAS 2025

TORCHDA: Data assimilation with neural networks

  • Cheng, Sibo (École nationale des ponts et chaussées)
  • Min, Jingyang (Imperial College London)
  • Wang, Kun (Imperial College London)
  • Dance, Sarah (University of Reading)
  • Piggott, Matthew (Imperial College London)
  • Bocquet, Marc (École nationale des ponts et chaussées)
  • Arcucci, Rossella (Imperial College London)


Recently, significant efforts have been made to integrate Data Assimilation (DA) and Deep Learning (DL) approaches, with objectives including parameter calibration, reduced-order surrogate modeling, error covariance specification, and model error correction. We present our recent work on multi-variable field prediction using sparse observational data, enabled by the development of our new Python package, TorchDA, which seamlessly integrates machine learning predictive functions into DA workflows. In TorchDA, we implement the Kalman Filter, Ensemble Kalman Filter (EnKF), 3D-Variational (3DVar), and 4D-Variational (4DVar) algorithms, allowing flexible algorithm selection based on application requirements. The main advantages of TorchDA over existing Python-based DA packages are twofold: (i) it can handle trained neural networks as forward and transformation functions, eliminating the need for an explicitly specified (analytical) transformation function, and (ii) it supports GPU acceleration for online optimization, particularly in variational DA methods. Numerical results will be demonstrated using two-dimensional CFD models, the open-access WeatherBench dataset, and a hydrology application.
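The variational methods mentioned above (3DVar/4DVar) perform an online minimization of a cost function balancing the background state against observations; this is the step that benefits from GPU-accelerated optimization. The sketch below is purely illustrative and does not reflect TorchDA's actual API: it implements a minimal linear 3DVar analysis in NumPy, minimizing the standard cost by gradient descent. All function and variable names here are hypothetical.

```python
import numpy as np

# Minimal 3DVar sketch (illustrative; not the TorchDA interface).
# Cost: J(x) = (x - xb)^T B^{-1} (x - xb) + (y - H x)^T R^{-1} (y - H x)
# with background state xb, background covariance B, observation y,
# linear observation operator H, and observation covariance R.

def threedvar(xb, B, y, H, R, lr=0.1, n_iter=500):
    """Return the analysis state minimizing the 3DVar cost by gradient descent."""
    Binv, Rinv = np.linalg.inv(B), np.linalg.inv(R)
    x = xb.copy()
    for _ in range(n_iter):
        # Gradient of J with respect to x.
        grad = 2 * Binv @ (x - xb) - 2 * H.T @ Rinv @ (y - H @ x)
        x -= lr * grad
    return x

# Toy example: two-state background, one observation of the first component.
xb = np.array([1.0, 2.0])       # background state
B = np.eye(2)                   # background error covariance
H = np.array([[1.0, 0.0]])      # observe only the first component
R = np.array([[1.0]])           # observation error covariance
y = np.array([3.0])             # observed value

xa = threedvar(xb, B, y, H, R)
# With equal error variances, the analysis moves the observed component
# halfway toward the observation: xa ≈ [2.0, 2.0].
```

In a neural-network-based workflow of the kind the abstract describes, `H` would be replaced by a trained network evaluated inside the optimization loop, and the gradient would be obtained by automatic differentiation rather than written out by hand.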